Generated for model: models/resnet50_prune30pct_best_model.pth
----------------------------------------------------------------
Layer (type) Output Shape Param #
================================================================
Conv2d-1 [-1, 64, 112, 112] 9,408
BatchNorm2d-2 [-1, 64, 112, 112] 128
MaxPool2d-3 [-1, 64, 56, 56] 0
Conv2d-4 [-1, 64, 56, 56] 4,096
BatchNorm2d-5 [-1, 64, 56, 56] 128
Conv2d-6 [-1, 64, 56, 56] 36,864
BatchNorm2d-7 [-1, 64, 56, 56] 128
Conv2d-8 [-1, 256, 56, 56] 16,384
BatchNorm2d-9 [-1, 256, 56, 56] 512
Conv2d-10 [-1, 256, 56, 56] 16,384
BatchNorm2d-11 [-1, 256, 56, 56] 512
Bottleneck-12 [-1, 256, 56, 56] 0
Conv2d-13 [-1, 64, 56, 56] 16,384
BatchNorm2d-14 [-1, 64, 56, 56] 128
Conv2d-15 [-1, 64, 56, 56] 36,864
BatchNorm2d-16 [-1, 64, 56, 56] 128
Conv2d-17 [-1, 256, 56, 56] 16,384
BatchNorm2d-18 [-1, 256, 56, 56] 512
Bottleneck-19 [-1, 256, 56, 56] 0
Conv2d-20 [-1, 64, 56, 56] 16,384
BatchNorm2d-21 [-1, 64, 56, 56] 128
Conv2d-22 [-1, 64, 56, 56] 36,864
BatchNorm2d-23 [-1, 64, 56, 56] 128
Conv2d-24 [-1, 256, 56, 56] 16,384
BatchNorm2d-25 [-1, 256, 56, 56] 512
Bottleneck-26 [-1, 256, 56, 56] 0
Conv2d-27 [-1, 128, 56, 56] 32,768
BatchNorm2d-28 [-1, 128, 56, 56] 256
Conv2d-29 [-1, 128, 28, 28] 147,456
BatchNorm2d-30 [-1, 128, 28, 28] 256
Conv2d-31 [-1, 512, 28, 28] 65,536
BatchNorm2d-32 [-1, 512, 28, 28] 1,024
Conv2d-33 [-1, 512, 28, 28] 131,072
BatchNorm2d-34 [-1, 512, 28, 28] 1,024
Bottleneck-35 [-1, 512, 28, 28] 0
Conv2d-36 [-1, 128, 28, 28] 65,536
BatchNorm2d-37 [-1, 128, 28, 28] 256
Conv2d-38 [-1, 128, 28, 28] 147,456
BatchNorm2d-39 [-1, 128, 28, 28] 256
Conv2d-40 [-1, 512, 28, 28] 65,536
BatchNorm2d-41 [-1, 512, 28, 28] 1,024
Bottleneck-42 [-1, 512, 28, 28] 0
Conv2d-43 [-1, 128, 28, 28] 65,536
BatchNorm2d-44 [-1, 128, 28, 28] 256
Conv2d-45 [-1, 128, 28, 28] 147,456
BatchNorm2d-46 [-1, 128, 28, 28] 256
Conv2d-47 [-1, 512, 28, 28] 65,536
BatchNorm2d-48 [-1, 512, 28, 28] 1,024
Bottleneck-49 [-1, 512, 28, 28] 0
Conv2d-50 [-1, 128, 28, 28] 65,536
BatchNorm2d-51 [-1, 128, 28, 28] 256
Conv2d-52 [-1, 128, 28, 28] 147,456
BatchNorm2d-53 [-1, 128, 28, 28] 256
Conv2d-54 [-1, 512, 28, 28] 65,536
BatchNorm2d-55 [-1, 512, 28, 28] 1,024
Bottleneck-56 [-1, 512, 28, 28] 0
Conv2d-57 [-1, 256, 28, 28] 131,072
BatchNorm2d-58 [-1, 256, 28, 28] 512
Conv2d-59 [-1, 256, 14, 14] 589,824
BatchNorm2d-60 [-1, 256, 14, 14] 512
Conv2d-61 [-1, 1024, 14, 14] 262,144
BatchNorm2d-62 [-1, 1024, 14, 14] 2,048
Conv2d-63 [-1, 1024, 14, 14] 524,288
BatchNorm2d-64 [-1, 1024, 14, 14] 2,048
Bottleneck-65 [-1, 1024, 14, 14] 0
Conv2d-66 [-1, 256, 14, 14] 262,144
BatchNorm2d-67 [-1, 256, 14, 14] 512
Conv2d-68 [-1, 256, 14, 14] 589,824
BatchNorm2d-69 [-1, 256, 14, 14] 512
Conv2d-70 [-1, 1024, 14, 14] 262,144
BatchNorm2d-71 [-1, 1024, 14, 14] 2,048
Bottleneck-72 [-1, 1024, 14, 14] 0
Conv2d-73 [-1, 256, 14, 14] 262,144
BatchNorm2d-74 [-1, 256, 14, 14] 512
Conv2d-75 [-1, 256, 14, 14] 589,824
BatchNorm2d-76 [-1, 256, 14, 14] 512
Conv2d-77 [-1, 1024, 14, 14] 262,144
BatchNorm2d-78 [-1, 1024, 14, 14] 2,048
Bottleneck-79 [-1, 1024, 14, 14] 0
Conv2d-80 [-1, 256, 14, 14] 262,144
BatchNorm2d-81 [-1, 256, 14, 14] 512
Conv2d-82 [-1, 256, 14, 14] 589,824
BatchNorm2d-83 [-1, 256, 14, 14] 512
Conv2d-84 [-1, 1024, 14, 14] 262,144
BatchNorm2d-85 [-1, 1024, 14, 14] 2,048
Bottleneck-86 [-1, 1024, 14, 14] 0
Conv2d-87 [-1, 256, 14, 14] 262,144
BatchNorm2d-88 [-1, 256, 14, 14] 512
Conv2d-89 [-1, 256, 14, 14] 589,824
BatchNorm2d-90 [-1, 256, 14, 14] 512
Conv2d-91 [-1, 1024, 14, 14] 262,144
BatchNorm2d-92 [-1, 1024, 14, 14] 2,048
Bottleneck-93 [-1, 1024, 14, 14] 0
Conv2d-94 [-1, 256, 14, 14] 262,144
BatchNorm2d-95 [-1, 256, 14, 14] 512
Conv2d-96 [-1, 256, 14, 14] 589,824
BatchNorm2d-97 [-1, 256, 14, 14] 512
Conv2d-98 [-1, 1024, 14, 14] 262,144
BatchNorm2d-99 [-1, 1024, 14, 14] 2,048
Bottleneck-100 [-1, 1024, 14, 14] 0
Conv2d-101 [-1, 512, 14, 14] 524,288
BatchNorm2d-102 [-1, 512, 14, 14] 1,024
Conv2d-103 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-104 [-1, 512, 7, 7] 1,024
Conv2d-105 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-106 [-1, 2048, 7, 7] 4,096
Conv2d-107 [-1, 2048, 7, 7] 2,097,152
BatchNorm2d-108 [-1, 2048, 7, 7] 4,096
Bottleneck-109 [-1, 2048, 7, 7] 0
Conv2d-110 [-1, 512, 7, 7] 1,048,576
BatchNorm2d-111 [-1, 512, 7, 7] 1,024
Conv2d-112 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-113 [-1, 512, 7, 7] 1,024
Conv2d-114 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-115 [-1, 2048, 7, 7] 4,096
Bottleneck-116 [-1, 2048, 7, 7] 0
Conv2d-117 [-1, 512, 7, 7] 1,048,576
BatchNorm2d-118 [-1, 512, 7, 7] 1,024
Conv2d-119 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-120 [-1, 512, 7, 7] 1,024
Conv2d-121 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-122 [-1, 2048, 7, 7] 4,096
Bottleneck-123 [-1, 2048, 7, 7] 0
AdaptiveAvgPool2d-124 [-1, 2048, 1, 1] 0
Linear-125 [-1, 1] 2,049
================================================================
Total params: 23,510,081
Trainable params: 23,510,081
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 213.24
Params size (MB): 89.68
Estimated Total Size (MB): 303.50
----------------------------------------------------------------
========================================================
--- Custom Model Pruning Summary (Sparsity Analysis) ---
========================================================
Layer Name | Total Params | Non-Zero Params | Sparsity (%)
------------------------------------------------------------------------------------------------
conv1 | 9,408 | 8,314 | 11.63%
layer1.0.conv1 | 4,096 | 3,763 | 8.13%
layer1.0.conv2 | 36,864 | 28,312 | 23.20%
layer1.0.conv3 | 16,384 | 15,020 | 8.33%
layer1.0.shortcut.0 | 16,384 | 15,005 | 8.42%
layer1.1.conv1 | 16,384 | 13,753 | 16.06%
layer1.1.conv2 | 36,864 | 28,507 | 22.67%
layer1.1.conv3 | 16,384 | 15,026 | 8.29%
layer1.2.conv1 | 16,384 | 13,730 | 16.20%
layer1.2.conv2 | 36,864 | 28,437 | 22.86%
layer1.2.conv3 | 16,384 | 14,993 | 8.49%
layer2.0.conv1 | 32,768 | 27,578 | 15.84%
layer2.0.conv2 | 147,456 | 104,856 | 28.89%
layer2.0.conv3 | 65,536 | 57,913 | 11.63%
layer2.0.shortcut.0 | 131,072 | 109,392 | 16.54%
layer2.1.conv1 | 65,536 | 51,142 | 21.96%
layer2.1.conv2 | 147,456 | 104,975 | 28.81%
layer2.1.conv3 | 65,536 | 57,920 | 11.62%
layer2.2.conv1 | 65,536 | 51,314 | 21.70%
layer2.2.conv2 | 147,456 | 105,103 | 28.72%
layer2.2.conv3 | 65,536 | 57,919 | 11.62%
layer2.3.conv1 | 65,536 | 51,148 | 21.95%
layer2.3.conv2 | 147,456 | 105,035 | 28.77%
layer2.3.conv3 | 65,536 | 58,089 | 11.36%
layer3.0.conv1 | 131,072 | 102,631 | 21.70%
layer3.0.conv2 | 589,824 | 394,887 | 33.05%
layer3.0.conv3 | 262,144 | 219,716 | 16.18%
layer3.0.shortcut.0 | 524,288 | 407,216 | 22.33%
layer3.1.conv1 | 262,144 | 189,633 | 27.66%
layer3.1.conv2 | 589,824 | 394,391 | 33.13%
layer3.1.conv3 | 262,144 | 219,912 | 16.11%
layer3.2.conv1 | 262,144 | 188,928 | 27.93%
layer3.2.conv2 | 589,824 | 392,750 | 33.41%
layer3.2.conv3 | 262,144 | 220,235 | 15.99%
layer3.3.conv1 | 262,144 | 188,652 | 28.03%
layer3.3.conv2 | 589,824 | 392,037 | 33.53%
layer3.3.conv3 | 262,144 | 219,955 | 16.09%
layer3.4.conv1 | 262,144 | 188,315 | 28.16%
layer3.4.conv2 | 589,824 | 390,009 | 33.88%
layer3.4.conv3 | 262,144 | 220,349 | 15.94%
layer3.5.conv1 | 262,144 | 188,044 | 28.27%
layer3.5.conv2 | 589,824 | 388,661 | 34.11%
layer3.5.conv3 | 262,144 | 219,810 | 16.15%
layer4.0.conv1 | 524,288 | 375,520 | 28.38%
layer4.0.conv2 | 2,359,296 | 1,481,960 | 37.19%
layer4.0.conv3 | 1,048,576 | 817,039 | 22.08%
layer4.0.shortcut.0 | 2,097,152 | 1,496,486 | 28.64%
layer4.1.conv1 | 1,048,576 | 700,004 | 33.24%
layer4.1.conv2 | 2,359,296 | 1,485,802 | 37.02%
layer4.1.conv3 | 1,048,576 | 815,871 | 22.19%
layer4.2.conv1 | 1,048,576 | 700,838 | 33.16%
layer4.2.conv2 | 2,359,296 | 1,478,561 | 37.33%
layer4.2.conv3 | 1,048,576 | 812,973 | 22.47%
linear | 2,048 | 1,443 | 29.54%
------------------------------------------------------------------------------------------------
TOTAL PRUNABLE | 23,456,960 | 16,419,872 | 30.00%
========================================================
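A per-layer sparsity table like the one above can be regenerated from a checkpoint by counting exact zeros in each prunable weight tensor. A sketch, assuming the pruning was made permanent (i.e. zeros are stored in the weight tensors themselves, as after `torch.nn.utils.prune.remove`):

```python
import torch

def sparsity_report(state_dict):
    """Per-layer sparsity from a checkpoint: fraction of exactly-zero weights.

    BatchNorm and bias tensors are skipped, mirroring the table above,
    which lists only conv and linear weights as prunable.
    """
    total = nonzero = 0
    for name, w in state_dict.items():
        if not name.endswith(".weight") or w.dim() < 2:
            continue  # keep only conv/linear weight tensors
        n, nz = w.numel(), int((w != 0).sum())
        total, nonzero = total + n, nonzero + nz
        print(f"{name} | {n:,} | {nz:,} | {100 * (1 - nz / n):.2f}%")
    print(f"TOTAL PRUNABLE | {total:,} | {nonzero:,} | {100 * (1 - nonzero / total):.2f}%")
    return total, nonzero
```

Usage would be `sparsity_report(torch.load("models/resnet50_prune30pct_best_model.pth", map_location="cpu"))`, assuming the file stores a plain state_dict.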
Heatmap of the weight matrix for a sample layer. White pixels represent pruned (zeroed) weights.
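One way to produce such a heatmap (a sketch using matplotlib; the layer choice and figure styling here are assumptions): flatten the weight tensor to 2-D and render its zero-mask, so pruned weights show as white.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt
import torch

def plot_pruning_mask(weight, path="pruning_mask.png"):
    """Render a layer's weights as a 2-D mask; white pixels are pruned (zeroed) weights."""
    w2d = weight.detach().reshape(weight.shape[0], -1)  # [out_channels, everything else]
    mask = (w2d == 0).float()                           # 1 where the weight was zeroed
    plt.figure(figsize=(6, 4))
    plt.imshow(mask, cmap="gray", aspect="auto")        # gray colormap: 1 -> white, 0 -> black
    plt.title("Pruning mask")
    plt.xlabel("flattened input dimensions")
    plt.ylabel("output channel")
    plt.savefig(path, dpi=150)
    plt.close()

# Example with a toy tensor standing in for a conv weight:
w = torch.randn(8, 4, 3, 3)
w[w.abs() < 0.3] = 0.0  # crude stand-in for magnitude pruning
plot_pruning_mask(w, "pruning_mask.png")
```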
Area Under ROC Curve (AUROC): 0.9986
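An AUROC like the one reported can be computed by collecting sigmoid scores from the single-logit head over a validation set. A sketch using scikit-learn's `roc_auc_score`; the loader and device handling are assumptions:

```python
import torch
from sklearn.metrics import roc_auc_score

@torch.no_grad()
def evaluate_auroc(model, loader, device="cpu"):
    """AUROC for a single-logit binary classifier: sigmoid scores vs. 0/1 labels."""
    model.eval()
    scores, labels = [], []
    for x, y in loader:
        logits = model(x.to(device)).squeeze(1)      # [-1, 1] -> [-1], matching Linear-125
        scores += torch.sigmoid(logits).cpu().tolist()
        labels += y.tolist()
    return roc_auc_score(labels, scores)
```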
Visualizing the model's attention focus on sample images.
[Figure: 25 sample image pairs, each showing the original image alongside its Grad-CAM overlay.]
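Overlays like those above can be generated with Grad-CAM: weight the target layer's activation maps by their spatially pooled gradients, apply ReLU, and upsample to the image size. A minimal sketch for a single-logit model; using the last convolutional stage (e.g. `model.layer4`) as the target layer is an assumption:

```python
import torch
import torch.nn.functional as F

def grad_cam(model, image, target_layer):
    """Grad-CAM heatmap in [0, 1] for a single-logit model and a CHW image tensor."""
    acts, grads = {}, {}
    h1 = target_layer.register_forward_hook(lambda m, i, o: acts.update(a=o))
    h2 = target_layer.register_full_backward_hook(lambda m, gi, go: grads.update(g=go[0]))
    try:
        model.eval()
        logit = model(image.unsqueeze(0)).squeeze()
        model.zero_grad()
        logit.backward()
        w = grads["g"].mean(dim=(2, 3), keepdim=True)   # pooled gradient per channel
        cam = F.relu((w * acts["a"]).sum(dim=1))        # weighted sum of activation maps
        cam = F.interpolate(cam.unsqueeze(0), size=image.shape[1:],
                            mode="bilinear", align_corners=False)[0, 0]
        return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    finally:
        h1.remove()
        h2.remove()
```

A call such as `cam = grad_cam(model, image_tensor, model.layer4)` yields a heatmap that can be color-mapped and alpha-blended over the original image to produce the panels above.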